Similar Resources
Properties of the Topographic Product of Experts
In this paper, we show how a topographic mapping can be created from a product of experts. We learn the parameters of the mapping using gradient descent on the negative logarithm of the probability density function of the data under the model. We show that the mapping, though retaining its product of experts form, becomes more like a mixture of experts during training.
Boosting as a Product of Experts
In this paper, we derive a novel probabilistic model of boosting as a Product of Experts. We re-derive the boosting algorithm as a greedy incremental model selection procedure which ensures that addition of new experts to the ensemble does not decrease the likelihood of the data. These learning rules lead to a generic boosting algorithm POEBoost which turns out to be similar to the AdaBoost alg...
Learning a Product of Experts with Elitist Lasso
Discriminative models such as logistic regression profit from the ability to incorporate arbitrary rich features; however, complex dependencies among overlapping features can often result in weight undertraining. One popular method that attempts to mitigate this problem is logarithmic opinion pools (LOP), which is a specialized form of product of experts model that automatically adjusts the wei...
Learning to Create Jazz Melodies Using a Product of Experts
We describe a neural network architecture designed to learn the musical structure of jazz melodies over chord progressions, then to create new melodies over arbitrary chord progressions from the resulting connectome (representation of neural network structure). Our architecture consists of two sub-networks, the interval expert and the chord expert, each being LSTM (long short-term memory) recur...
Diffusion Networks, Product of Experts, and Factor Analysis
Hinton (in press) recently proposed a learning algorithm called contrastive divergence learning for a class of probabilistic models called product of experts (PoE). Whereas in standard mixture models the “beliefs” of individual experts are averaged, in PoEs the “beliefs” are multiplied together and then renormalized. One advantage of this approach is that the combined beliefs can be much sharpe...
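The abstract above names the key contrast: in a mixture model the experts' beliefs are averaged, while in a product of experts (PoE) they are multiplied together and renormalized, which can yield much sharper combined distributions. A minimal sketch of that contrast for two discrete distributions (the function names and example values are ours, not from the paper):

```python
# Sketch: combining two discrete "expert" distributions over the same
# variable, contrasting a mixture (average) with a product of experts
# (multiply, then renormalize).
import numpy as np

def mixture(p, q):
    """Mixture of experts: beliefs are averaged."""
    return (p + q) / 2.0

def product_of_experts(p, q):
    """Product of experts: beliefs are multiplied and renormalized."""
    unnorm = p * q
    return unnorm / unnorm.sum()

# Two experts over three outcomes; each assigns zero probability to
# (i.e. "vetoes") a different outcome.
p = np.array([0.5, 0.5, 0.0])
q = np.array([0.0, 0.5, 0.5])

mix = mixture(p, q)             # [0.25, 0.5, 0.25]: mass on every outcome
poe = product_of_experts(p, q)  # [0.0, 1.0, 0.0]: only the outcome no
                                # expert vetoes survives
```

The example shows why PoE distributions can be sharper than mixtures: any expert assigning near-zero probability to an outcome suppresses that outcome in the product, whereas averaging always keeps it alive.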
Journal
Journal title: Scholarpedia
Year: 2007
ISSN: 1941-6016
DOI: 10.4249/scholarpedia.3879